Smoothed functional (SF) schemes for gradient estimation are known to be efficient in stochastic optimization algorithms, especially when the objective is to improve the performance of a stochastic system. However, the performance of these methods depends on several parameters, such as the choice of a suitable smoothing kernel. Different kernels have been studied in the literature, including the Gaussian, Cauchy, and uniform distributions, among others. This paper studies a new class of kernels based on the q-Gaussian distribution, which has gained popularity in statistical physics over the last decade. Although the importance of this family of distributions is usually attributed to its ability to generalize the Gaussian distribution, we observe that this class encompasses almost all existing smoothing kernels. This motivates us to study SF schemes for gradient estimation using the q-Gaussian distribution. Using the derived gradient estimates, we propose two-timescale algorithms for optimization of a stochastic objective function in a constrained setting via a projected gradient search approach. We prove the convergence of our algorithms to the set of stationary points of an associated ODE and demonstrate their performance numerically through simulations on a queuing model.
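To illustrate the kind of estimator the abstract refers to, the following is a minimal sketch of a one-sided SF gradient estimate driven by q-Gaussian perturbations. It relies on two assumptions not taken from the paper: (i) for 1 < q < 3 a q-Gaussian random variable can be sampled, up to scale, as a Student-t with nu = (3 - q)/(q - 1) degrees of freedom, and (ii) the plain SF form E[eta (J(theta + beta*eta) - J(theta))]/beta with an empirical second-moment correction is used; the paper derives the exact q-Gaussian estimator and its bias terms, and the function names here are illustrative.

```python
import numpy as np

def q_gaussian(rng, q, size):
    """Draw q-Gaussian samples (up to scale) via the Student-t link,
    assuming 1 < q < 3 so the t-distribution equivalence applies."""
    nu = (3.0 - q) / (q - 1.0)
    return rng.standard_t(nu, size=size)

def sf_gradient(J, theta, q=1.2, beta=0.01, n_samples=100_000, seed=0):
    """One-sided SF estimate: average of eta * (J(theta + beta*eta) - J(theta)) / beta,
    normalized by the perturbations' empirical second moment (since E[eta^2] != 1)."""
    rng = np.random.default_rng(seed)
    eta = q_gaussian(rng, q, size=(n_samples, theta.size))
    dJ = np.array([J(theta + beta * e) - J(theta) for e in eta])
    grad = (eta * dJ[:, None]).mean(axis=0) / beta
    return grad / (eta ** 2).mean()

# Toy quadratic objective with known gradient A @ theta.
A = np.diag([2.0, 5.0])
J = lambda th: 0.5 * th @ A @ th
theta = np.array([1.0, -1.0])
print(sf_gradient(J, theta))  # close to the true gradient A @ theta = [2., -5.]
```

With q chosen close to 1 the perturbations approach a standard Gaussian and this reduces to the classical Gaussian SF scheme; heavier-tailed choices of q require finite-moment conditions for the averages above to be stable.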